Lower bounds on Information Divergence
Authors
Abstract
In this paper we establish lower bounds on the information divergence from a distribution to certain important classes of distributions, such as the Gaussian, exponential, gamma, Poisson, geometric, and binomial families. These lower bounds are tight, and for several convergence theorems where a rate of convergence can be computed, the rate is determined by the lower bounds proved in this paper. General techniques for obtaining lower bounds in terms of moments are developed.
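As a point of reference for the moment-based technique, the sketch below (not from the paper; all distribution choices are illustrative) demonstrates a standard fact underlying such bounds: within the Poisson family, the divergence D(P || Po(lam)) is minimized by matching the mean of P, so the divergence from P to the entire family is controlled by moments of P.

```python
import numpy as np
from scipy.stats import binom, poisson

# Illustrative choice: P = Binomial(n=20, p=0.1), support {0, ..., 20}.
n, p = 20, 0.1
k = np.arange(0, n + 1)
P = binom.pmf(k, n, p)

def kl_to_poisson(lam):
    # Information divergence D(P || Poisson(lam)) in nats.
    Q = poisson.pmf(k, lam)
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

# Scan lambda: the minimum sits at the mean of P, here n*p = 2.
lams = np.linspace(0.5, 5.0, 451)
divs = [kl_to_poisson(lam) for lam in lams]
print(f"mean of P = {n * p:.3f}")
print(f"argmin of D(P || Po(lam)) ~ {lams[int(np.argmin(divs))]:.3f}")
```

The minimizing lambda equals the mean of P; lower bounds of the kind proved in the paper then control how small this minimal divergence can be in terms of further moments of P.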
Similar resources
Extrinsic Jensen-Shannon Divergence: Applications to Variable-Length Coding
Abstract—This paper considers the problem of variable-length coding over a discrete memoryless channel (DMC) with noiseless feedback. The paper provides a stochastic control view of the problem whose solution is analyzed via a newly proposed symmetrized divergence, termed extrinsic Jensen–Shannon (EJS) divergence. It is shown that strictly positive lower bounds on EJS divergence provide non-asy...
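The EJS divergence itself is defined in the cited paper and is not reproduced here; for orientation, the following is a minimal sketch of the classical Jensen-Shannon divergence, the symmetrized quantity that EJS extends (function names are illustrative):

```python
import numpy as np

def kl(p, q):
    # Information divergence D(p || q) in nats, for discrete pmfs on a common support.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_shannon(p, q):
    # Classical Jensen-Shannon divergence: average KL to the midpoint mixture.
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.3, 0.6])
print(jensen_shannon(p, q))  # symmetric in (p, q) and bounded by log 2
```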
On Lower Bounds for Statistical Learning Theory
In recent years, tools from information theory have played an increasingly prevalent role in statistical machine learning. In addition to developing efficient, computationally feasible algorithms for analyzing complex datasets, it is of theoretical importance to determine whether such algorithms are “optimal” in the sense that no other algorithm can lead to smaller statistical error. This paper...
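A canonical tool behind such optimality statements is Fano's inequality; the minimal sketch below (numbers illustrative, not from the paper) turns a mutual-information bound into a lower bound on the error probability of any estimator over M hypotheses:

```python
import numpy as np

def fano_lower_bound(mutual_info_nats, M):
    # Fano's inequality: for M >= 2 equally likely hypotheses, any estimator
    # errs with probability at least 1 - (I + log 2) / log M, where I is the
    # mutual information (in nats) between the parameter and the data.
    return max(0.0, 1.0 - (mutual_info_nats + np.log(2)) / np.log(M))

# Illustrative: 16 hypotheses, data carrying 1.5 nats of information.
print(fano_lower_bound(1.5, 16))  # ~ 0.21
```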
Topics in Statistical Theory
Contents: 1 Deviation bounds; 1.1 Markov and generalization; 1.2 Class of sub-Gaussian random variables; 1.2.1 Basic properties; 1.2...
Estimating Upper and Lower Bounds For Industry Efficiency With Unknown Technology
After a brief review of studies on industry efficiency in the Data Envelopment Analysis (DEA) framework, the present paper proposes inner and outer technologies for the case where only some basic information about the technology is available. Furthermore, applying linear programming techniques, it determines lower and upper bounds for the directional distance function (DDF) measure and for overall and allocative efficienc...
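The paper's specific inner/outer-technology models are not reproduced here; as a generic illustration of how a directional distance function is evaluated by linear programming, here is a minimal sketch of the standard DDF model under variable returns to scale (toy data and names are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 decision-making units, 1 input, 1 output (values illustrative).
X = np.array([[2.0], [4.0], [6.0], [5.0]])  # inputs, shape (J, m)
Y = np.array([[1.0], [3.0], [4.0], [2.0]])  # outputs, shape (J, s)

def ddf(o, gx, gy):
    # Standard DDF under variable returns to scale:
    #   max beta  s.t.  sum_j lam_j x_j <= x_o - beta * gx,
    #                   sum_j lam_j y_j >= y_o + beta * gy,
    #                   sum_j lam_j = 1,  lam >= 0.
    J, m = X.shape
    s = Y.shape[1]
    c = np.concatenate(([-1.0], np.zeros(J)))    # linprog minimizes, so use -beta
    A_ub = np.vstack([
        np.hstack([gx.reshape(m, 1), X.T]),      # beta*gx + X^T lam <= x_o
        np.hstack([gy.reshape(s, 1), -Y.T]),     # beta*gy - Y^T lam <= -y_o
    ])
    b_ub = np.concatenate([X[o], -Y[o]])
    A_eq = np.concatenate(([0.0], np.ones(J))).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (J + 1))
    return res.x[0]

for o in range(len(X)):
    print(o, round(ddf(o, gx=X[o], gy=Y[o]), 4))  # beta = 0 means efficient
```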
f-Divergences and Related Distances
Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reversed Pinsker's inequality is derived for an arbitra...
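For orientation on how these quantities relate, a quick numerical check (illustrative, not the paper's argument) of the classical Pinsker direction TV <= sqrt(D/2), the inequality whose reverse the cited paper refines:

```python
import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    # Information divergence D(p || q) in nats.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def tv(p, q):
    # Total variation distance.
    return 0.5 * float(np.abs(p - q).sum())

# Pinsker: TV <= sqrt(D/2). A reversed Pinsker inequality bounds D from
# above by a function of TV, under extra conditions such as a positive
# minimum mass of q (see the cited paper for the refined versions).
for _ in range(5):
    p = rng.dirichlet(np.ones(4))
    q = rng.dirichlet(np.ones(4))
    print(f"TV = {tv(p, q):.4f}   sqrt(D/2) = {np.sqrt(kl(p, q) / 2):.4f}")
```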
Journal: CoRR
Volume: abs/1102.2536
Issue: -
Pages: -
Publication date: 2011